Coding theory

Results: 1371



#  Item
191  Finite fields / XTR / Coding theory / Group testing / Vacuous truth / Mathematics / Logic / Combinatorics

Slide 2.11, 2.14 & 3.2 MATH HISTORY QUESTION EXERCISE ONE My last math course was (course, year, and school): I would say that my experience in that course was:


Source URL: bridge2success.aacc.edu

Language: English - Date: 2012-12-13 14:53:37
192  Coding theory / Probability theory / Tree / Bipartite graph / Bridge / Binary tree / Graph theory / Mathematics / Belief propagation

C:/Users/Shu/Desktop/Distributed spectrum allocation/paper/IEEEtran5/WCNC-Final.dvi


Source URL: www.cse.unt.edu

Language: English - Date: 2013-08-12 14:12:50
193  Error detection and correction / Finite fields / Thermodynamics / Continuum mechanics / Coding theory

An Introduction to Human-Centered Design (HCD)


Source URL: cemusstudent.se

Language: English - Date: 2015-05-04 11:48:38
194  Statistics / Entropy / Algorithmic information theory / Noisy-channel coding theorem / LZ77 and LZ78 / Kolmogorov complexity / Kullback–Leibler divergence / Complexity / Huffman coding / Information theory / Information / Theoretical computer science

Institute of Physics Publishing, European Journal of Physics, S69–S77, doi:/26/5/S08


Source URL: samarcanda.phys.uniroma1.it

Language: English - Date: 2007-11-30 04:52:45
195

1996 Paper 9 Question 11 Information Theory and Coding Consider a noiseless analog communication channel whose bandwidth is 10,000 Hz. A signal of duration 1 second is received over such a channel. We wish to represent


Source URL: www.cl.cam.ac.uk

Date: 2014-06-09 10:17:13
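The 1996 question above is truncated, so the exact task it sets is not recoverable here; what it clearly invokes is the Nyquist sampling count, by which a signal band-limited to W Hz and observed for T seconds is determined by 2WT samples. A minimal sketch of that arithmetic (variable names are illustrative, not from the question):

```python
# Nyquist sample count for the channel in the excerpted 1996 question:
# bandwidth W = 10,000 Hz, signal duration T = 1 second.

W = 10_000   # channel bandwidth in Hz
T = 1.0      # signal duration in seconds

# A signal band-limited to W Hz is fully determined by 2*W samples
# per second, hence 2*W*T samples over the whole observation.
nyquist_rate = 2 * W             # samples per second
num_samples = int(nyquist_rate * T)

print(nyquist_rate)  # 20000
print(num_samples)   # 20000
```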
196

2009 Paper 9 Question 9 Information Theory and Coding (a) Calculate the entropy in bits for each of the following random variables: (i) Pixel values in an image whose possible grey values are all the integers from 0 to


Source URL: www.cl.cam.ac.uk

Date: 2014-06-09 10:18:33
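The 2009 Paper 9 excerpt above cuts off the pixel range, so the 256 equiprobable grey levels (0 to 255) below are an assumption used purely for illustration; for any uniform distribution over N outcomes the entropy is log2(N) bits:

```python
import math

# Shannon entropy in bits of a discrete distribution, as asked for in
# the excerpted exam question. The 256-level uniform pixel distribution
# is an illustrative assumption (the question text above is truncated).
def entropy_bits(probs):
    """H = -sum p * log2(p), ignoring zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform_256 = [1 / 256] * 256
print(entropy_bits(uniform_256))  # 8.0  (log2 of 256)
```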
197

2009 Paper 7 Question 11 Information Theory and Coding (a) Let X and Y be discrete random variables over state ensembles {x} and {y} having probability distributions p(x) and p(y), conditional probability distributions


Source URL: www.cl.cam.ac.uk

Date: 2014-06-09 10:18:30
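The 2009 Paper 7 excerpt sets up joint and conditional distributions over ensembles {x} and {y}; the quantity such questions typically build toward is the mutual information I(X;Y) = sum p(x,y) log2 [p(x,y) / (p(x)p(y))]. A sketch on an illustrative joint distribution (the question's actual distribution is not shown above):

```python
import math

# Mutual information I(X;Y) in bits, computed from a joint distribution
# given as a dict {(x, y): p(x, y)}. Marginals are recovered by summing.
def mutual_information(joint):
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Illustrative example: a perfectly correlated fair bit pair shares
# exactly one bit of information.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(joint))  # 1.0
```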
198

1994 Paper 8 Question 5 Information Theory and Coding Define the Fourier Series of the periodic function f(x) with period 2π, giving formulae for the Fourier Coefficients. [5 marks]


Source URL: www.cl.cam.ac.uk

Date: 2014-06-09 10:17:00
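The standard definitions the 1994 question asks for, for a function f of period 2π, are:

```latex
f(x) \sim \frac{a_0}{2} + \sum_{n=1}^{\infty}\bigl(a_n \cos nx + b_n \sin nx\bigr),
\qquad
a_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\cos nx \,dx,
\quad
b_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\sin nx \,dx .
```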
199

2007 Paper 8 Question 7 Information Theory and Coding (a) Suppose that X is a random variable whose entropy H(X) is 8 bits. Suppose that Y (X) is a deterministic function that takes on a different value for each value o


Source URL: www.cl.cam.ac.uk

Date: 2014-06-09 10:18:20
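The key fact behind the 2007 excerpt: if Y(X) is deterministic and takes a distinct value for each x, it is an invertible relabelling of X, so H(Y) = H(X) (here 8 bits) and H(Y|X) = 0, since entropy depends only on the multiset of probabilities. A numerical check on an illustrative distribution (the question's actual distribution is not given above):

```python
import math

# Entropy is invariant under an invertible relabelling of outcomes:
# permuting the probabilities leaves H unchanged.
def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [0.5, 0.25, 0.125, 0.125]        # illustrative distribution of X
py = [px[i] for i in (2, 0, 3, 1)]    # Y = bijective relabelling of X
assert entropy_bits(px) == entropy_bits(py)
print(entropy_bits(px))  # 1.75
```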
200

1997 Paper 8 Question 11 Information Theory and Coding The input source to a noisy communication channel is a random variable X over the four symbols a, b, c, d. The output from this channel is a random variable Y over


Source URL: www.cl.cam.ac.uk

Date: 2014-06-09 10:17:17
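The 1997 excerpt describes a noisy channel with input X over {a, b, c, d}. The basic mechanics such a question relies on is pushing the input distribution p(x) through the channel matrix p(y|x) to obtain the output distribution p(y) = sum_x p(x) p(y|x). A sketch with an illustrative channel matrix (the question's actual matrix is not shown above):

```python
# Push an input distribution through a channel transition matrix to get
# the output distribution. Both the matrix and p(x) are illustrative.
channel = {                      # p(y|x): one row per input symbol
    "a": {"a": 0.9, "b": 0.1},
    "b": {"b": 1.0},
    "c": {"c": 0.8, "d": 0.2},
    "d": {"d": 1.0},
}
px = {"a": 0.25, "b": 0.25, "c": 0.25, "d": 0.25}  # uniform input

py = {}
for x, row in channel.items():
    for y, p in row.items():
        py[y] = py.get(y, 0.0) + px[x] * p   # p(y) = sum_x p(x) p(y|x)

print(py)
```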